04. extended intelligences

reflection

The seminar was an interesting continuation of the previous module, where we looked at the behind-the-scenes implications of AI use and ended with a brief introduction to the different tools and datasets available. This time the theory was extended into a more tangible application of the technology, exploring how it can be implemented into objects. The combination of AI engines and Barduino programming gave an insight into the possibility of extending the way we use artifacts to collect and interpret data.

The first exercise, using the DOTTOD camera application, provided a quick and easy understanding of the steps required for the machine to receive an input, analyze it and, through prompt crafting by the user, generate an interpretation of the original information. From a personal point of view it was important to understand the process and the restrictions it involves, such as planning ahead the input that is going to be fed to the tool in order to allow the best possible analysis. It became clear that to make the best of this type of tool, the input must be intentionally produced in all its elements, knowing what the machine is able to take from it. In the alternative scenario where the input cannot be intentional, for example a previously taken photograph, the other relevant learning from this exercise comes into play. This second aspect is, in my opinion, the most relevant takeaway of the exercise: prompt engineering.

The second exercise proved to be a confirmation of this conclusion: prompt engineering is essential for an optimal use of these tools. From the little experience and understanding I have in the subject, I would go as far as suggesting that half of the job of generative AI tools is done in the prompt definition. The experience of using the Modmatrix interface added a few steps to the previous exercise and allowed further exploration of how to work collaboratively with the machine to produce an outcome. It was interesting to explore how to collaborate with the tool, making it understand the desired analysis of the given input and then crafting an output.


The seminar concluded with a final project proposal to apply a combination of Arduino programming and generative AI to an artifact. The concept our group developed for this exercise was an exploration of how we can communicate with a given space through light. The idea was to experiment with how the machine can perceive the mood in the room and then communicate this perception as an emotion in the shape of a melody.

For this experiment, taking time limitations into consideration, we decided that the machine would perceive the space through light and temperature, two sensors already available on the Barduino board. Throughout the process the complexity of the “simple task” we chose became evident. The resolution was not only to reduce the melody to a simple series of beeps, but also to reduce the input fed to the machine to light sensing alone.
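As a reference for that reduced setup, here is a minimal sketch of what the sensing side looks like conceptually, assuming an analog light sensor and a buzzer on placeholder pins; the actual Barduino pinout differs, and the AI step is left out here:

```cpp
// Minimal sketch: read an analog light sensor and translate brightness
// bands into a short series of beeps. Pin numbers are placeholders,
// not the real Barduino pinout, and tone() is assumed to be available
// on the board's Arduino core.

const int LDR_PIN = 4;      // analog input for the light sensor (assumed pin)
const int BUZZER_PIN = 14;  // buzzer output (assumed pin)

void setup() {
  Serial.begin(115200);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  int light = analogRead(LDR_PIN);        // 0-4095 on an ESP32-class ADC
  int beeps = map(light, 0, 4095, 1, 5);  // brighter room -> more beeps

  Serial.print("light: ");
  Serial.print(light);
  Serial.print(" -> beeps: ");
  Serial.println(beeps);

  for (int i = 0; i < beeps; i++) {
    tone(BUZZER_PIN, 880, 100);  // short 880 Hz beep
    delay(200);
  }
  delay(2000);  // sample the room again after a pause
}
```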

The result was not the expected one. In the end the artifact wasn't able to translate the characteristics of the light readings into any sort of coherent response. The group's conclusion was that, since the Barduino programming and circuits seemed to be correctly made, the problem lay in the prompt given to the AI tool to produce an output.
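Looking back, the prompt was probably too open-ended. Purely as an illustration of the direction we could have taken, here is a hedged sketch of how a more constrained prompt could be assembled on the board before being sent to the AI tool; the wording, the JSON shape and the buildPrompt helper are my own example, not what we actually used, and the API call itself is omitted:

```cpp
// Hypothetical helper: turn one raw light reading into an explicit,
// machine-friendly prompt. The requested JSON format is an illustration only.
String buildPrompt(int lightRaw) {
  int lightPct = map(lightRaw, 0, 4095, 0, 100);  // ESP32-class ADC range assumed
  String prompt =
    "You receive one light reading from a room, given as a percentage of full brightness. "
    "Interpret the mood of the room and reply ONLY with a JSON object of the form "
    "{\"beeps\": n, \"gap_ms\": t}, where n is 1-5 and t is the pause between beeps in milliseconds. "
    "Dark, calm rooms should give few, slow beeps; bright, busy rooms more, faster beeps.\n"
    "Light level: ";
  prompt += String(lightPct);
  prompt += "%";
  return prompt;
}
```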

To be clear about my personal contribution to the project, my nonexistent Arduino skills were overshadowed by my teammates', so most of my input came in the concept ideation and in collectively tinkering with the prompt given to the machine. Which failed.

Nevertheless, this project gave a very interesting insight into the capabilities and applications of combining these technologies. The artifact we developed is now under consideration to be adapted to the main project of the master, as an alternative way to sense a specific space and define its characteristics. As for the relevance this experience brought to the master's project, one aspect is the possibility of using these tools to re-interpret data and project it in a sort of “impartial” way by “collaborating” with them.